[Auto Parallel] fix bugs for split_batches_for_accumulation && fix bugs for enable_delay_scale_loss #9217
Conversation
Thanks for your contribution!
Codecov Report
Attention: Patch coverage is

Additional details and impacted files

@@            Coverage Diff             @@
##           develop    #9217      +/-   ##
===========================================
- Coverage    53.11%   52.74%   -0.38%
===========================================
  Files          665      661       -4
  Lines       109041   107375    -1666
===========================================
- Hits         57918    56634    -1284
+ Misses       51123    50741     -382

View full report in Codecov by Sentry.
LGTM
LGTM
LGTM
LGTM
Commits in PaddlePaddle#9217:
* [Auto Parallel] fix bugs for split_batches_for_accumulation && fix bugs for enable_delay_scale_loss
* add enable_delay_scale_loss flag for auto_parallel
* fix ut
* Update ci_case_auto.sh
PR types
Bug fixes
PR changes
Others
Description
A. Fix the case where, under dynamic-graph auto parallel, split_batches_for_accumulation cannot be aligned with the hand-written dynamic-graph parallel implementation, as shown in the figure. The intended splitting is illustrated in the sketch below.
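The following is a minimal sketch, not the PR's actual code (function and variable names are illustrative), of the intended semantics: each accumulation step should consume a contiguous slice of the same global batch, so that auto parallel and the hand-written loop see identical data.

```python
import numpy as np

def split_batches_for_accumulation_sketch(global_batch, acc_steps):
    """Illustrative only: split a global batch into `acc_steps` micro-batches
    along axis 0, preserving sample order."""
    assert global_batch.shape[0] % acc_steps == 0, "batch size must divide evenly"
    micro_bsz = global_batch.shape[0] // acc_steps
    return [global_batch[i * micro_bsz:(i + 1) * micro_bsz] for i in range(acc_steps)]

# Example: a global batch of 8 samples with 4 accumulation steps
# yields 4 micro-batches of 2 samples each.
batch = np.arange(8).reshape(8, 1)
micro_batches = split_batches_for_accumulation_sketch(batch, acc_steps=4)
assert len(micro_batches) == 4 and micro_batches[0].shape == (2, 1)
```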
B. Fix incorrect enable_delay_scale_loss logic under dynamic-graph auto parallel. Auto parallel enables enable_delay_scale_loss by default; the expected behavior is shown in the first figure, while the current dynamic-graph auto parallel behavior is shown in the second.
In addition, because the enable_delay_scale_loss logic is not yet complete, this PR adds a switch for it in auto parallel. The intended difference between the two scaling modes is sketched below.
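A rough illustration of the distinction, under the assumption that enable_delay_scale_loss means the accumulated loss is divided by the number of accumulation steps once at the end, rather than pre-scaling each micro-batch loss; this is a plain-Python sketch with illustrative names, not the PR's implementation.

```python
def accumulate_loss(micro_losses, enable_delay_scale_loss=True):
    """Illustrative: how the effective loss is formed over accumulation steps.

    With delayed scaling, per-step losses stay unscaled and the sum is divided
    once at the end; without it, each step's loss is pre-scaled by 1/acc_steps.
    Both give the same total, but the per-step values that reach backward (and
    get logged) differ, which is the source of the mismatch being fixed.
    """
    acc_steps = len(micro_losses)
    if enable_delay_scale_loss:
        stepwise = list(micro_losses)               # unscaled per-step losses
        total = sum(stepwise) / acc_steps           # scale once at the end
    else:
        stepwise = [l / acc_steps for l in micro_losses]  # scale every step
        total = sum(stepwise)
    return stepwise, total

# Example: 4 micro-batch losses; both modes yield the same total of 5.0.
_, total = accumulate_loss([2.0, 4.0, 6.0, 8.0], enable_delay_scale_loss=True)
assert total == 5.0
```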
C. Fix how the loss is printed under static-graph auto parallel. The expected display is shown in the first figure, but the current static-graph auto parallel display is shown in the second; a sketch of the presumed intent follows.
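A minimal sketch, assuming the printed loss for a global step should be the average over that step's micro-batch losses rather than the value from a single micro-batch (the helper name is hypothetical):

```python
def loss_for_logging(micro_losses):
    """Hypothetical helper: report the mean micro-batch loss for one global
    step, so the printed value matches the quantity being optimized."""
    return sum(micro_losses) / len(micro_losses)

print(f"global step loss: {loss_for_logging([2.0, 4.0, 6.0, 8.0]):.4f}")  # 5.0000
```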
D. Fix a misspelled function name: rename traning to training.